Graph-based regularization for regression problems with highly-correlated designs

Authors

  • Yuan Li
  • Garvesh Raskutti
  • Rebecca Willett
Abstract

Sparse models for high-dimensional linear regression and machine learning have received substantial attention over the past two decades. Model selection, or determining which features or covariates are the best explanatory variables, is critical to the interpretability of a learned model. Much of the current literature assumes that covariates are only mildly correlated. However, in modern applications ranging from functional MRI to genome-wide association studies, covariates are highly correlated and do not exhibit key properties (such as the restricted eigenvalue condition, the restricted isometry property, or other related assumptions). This paper considers a high-dimensional regression setting in which a graph governs both correlations among the covariates and the similarity among regression coefficients. Using side information about the strength of correlations among features, we form a graph with edge weights corresponding to pairwise covariances. This graph is used to define a graph total variation regularizer that promotes similar weights for highly correlated features. The graph structure encapsulated by this regularizer helps precondition correlated features to yield provably accurate estimates. Graph-based similarity measures have led to successful regularization in the cases of the fused LASSO, edge LASSO, and graph trend filtering; however, using graph-based regularizers to develop theoretical guarantees for highly correlated covariates has not been previously examined. This paper shows how our proposed graph-based regularization yields mean-squared error guarantees for a broad range of covariance graph structures and correlation strengths; in many cases these guarantees are optimal, a consequence of imposing additional structure on β that encourages alignment with the covariance graph. Our proposed approach outperforms other state-of-the-art methods for highly correlated designs in a variety of experiments on simulated and real fMRI data.
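
Read as an optimization problem, the estimator sketched in the abstract is a penalized least-squares fit. The display below is a plausible form inferred from the description above, not the paper's verbatim objective; in particular, the exact edge weights and the treatment of negatively correlated pairs may differ:

    \hat{\beta} \in \arg\min_{\beta \in \mathbb{R}^p} \; \frac{1}{2n} \| y - X\beta \|_2^2 + \lambda \sum_{(i,j) \in E} w_{ij} \, | \beta_i - \beta_j |

Here E is the edge set of the covariance graph, w_{ij} is an edge weight derived from the covariance between features i and j, and the sum is the graph total variation penalty that pulls coefficients of highly correlated features toward a common value.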

Similar articles

Inverse Problems in Engineering: Selection of Multiple Regularization Parameters in Local Ridge Regression Using Evolutionary Algorithms and Prediction Risk Optimization

This paper presents a new methodology for regularizing data-based predictive models. Traditional modeling using regression can produce unrepeatable, unstable, or noisy predictions when the inputs are highly correlated. Ridge regression is a regularization technique used to deal with those problems. A drawback of ridge regression is that it optimizes a single regularization parameter while the m...
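
For context on the mechanism being generalized (a standard construction, stated here because the snippet above is truncated): ordinary ridge regression shrinks all coefficients with a single parameter, while a multi-parameter (generalized) ridge assigns each coefficient its own penalty,

    \hat{\beta}_{\mathrm{ridge}} = (X^\top X + \lambda I)^{-1} X^\top y, \qquad \hat{\beta}_{\mathrm{gen}} = (X^\top X + \mathrm{diag}(\lambda_1, \ldots, \lambda_p))^{-1} X^\top y,

and selecting the vector (\lambda_1, \ldots, \lambda_p) is the multi-parameter search problem that evolutionary algorithms and prediction-risk optimization are used to solve.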


Bayesian Regularization via Graph Laplacian

Regularization plays a critical role in modern statistical research, especially in high dimensional variable selection problems. Existing Bayesian methods usually assume independence between variables a priori. In this article, we propose a novel Bayesian approach, which explicitly models the dependence structure through a graph Laplacian matrix. We also generalize the graph Laplacian to allow ...
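
A common way to encode such dependence, sketched here as background rather than as the article's exact prior, is a Gaussian prior whose precision matrix is built from the graph Laplacian L = D - W:

    \beta \mid \sigma^2 \sim \mathcal{N}\big( 0, \; \sigma^2 (L + \epsilon I)^{-1} \big), \qquad \epsilon > 0,

so that variables adjacent in the graph are positively correlated a priori, with the \epsilon I term making the precision matrix invertible.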


Semi-Supervised Regression using Spectral Techniques

Graph-based approaches for semi-supervised learning have received an increasing amount of interest in recent years. Despite their good performance, many pure graph-based algorithms do not have explicit prediction functions and cannot predict the labels of unseen data. Graph regularization is a recently proposed framework which incorporates the intrinsic geometrical structure as a regularization term. It can ...
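
The framework referenced here typically augments a supervised loss with a Laplacian smoothness term. A generic form (an assumption about the setup, since the snippet is truncated) is

    \min_{f} \; \sum_{i=1}^{l} \big( f(x_i) - y_i \big)^2 + \gamma_A \| f \|_K^2 + \gamma_I \, \mathbf{f}^\top L \mathbf{f},

where L is the graph Laplacian over both labeled and unlabeled points and \mathbf{f} collects the evaluations of f at those points. Because f lives in an explicit function space, it can be evaluated at unseen data, addressing the limitation of purely graph-based algorithms noted above.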


Robust Classification of Graph-Based Data

A graph-based classification method is proposed both for semi-supervised learning in the case of Euclidean data and for classification in the case of graph data. Our manifold learning technique is based on a convex optimization problem involving a convex regularization term and a concave loss function with a trade-off parameter carefully chosen so that the objective function remains convex. As ...
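
The convexity trade-off mentioned here follows a general pattern, stated as background rather than as the paper's specific bound: if the regularizer R is \mu-strongly convex and the concave loss \ell has curvature bounded below by -M, then

    F(f) = R(f) + \tau \, \ell(f)

satisfies \nabla^2 F \succeq (\mu - \tau M) I, so the objective remains convex whenever the trade-off parameter obeys \tau \le \mu / M.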


Simultaneous support recovery in high dimensions: Benefits and perils of block l1/l∞-regularization

Given a collection of r ≥ 2 linear regression problems in p dimensions, suppose that the regression coefficients share partially common supports of size at most s. This set-up suggests the use of l1/l∞-regularized regression for joint estimation of the p × r matrix of regression coefficients. We analyze the high-dimensional scaling of l1/l∞-regularized quadratic programming, considering both co...
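
The regularizer in question takes the standard block form, consistent with the description above: an l∞ norm across the r tasks for each coefficient row, summed over rows,

    \hat{B} \in \arg\min_{B \in \mathbb{R}^{p \times r}} \; \frac{1}{2n} \| Y - XB \|_F^2 + \lambda \sum_{k=1}^{p} \max_{1 \le j \le r} | B_{kj} |,

so each row of B is either zeroed out across all r problems or retained for all of them; this coupling is the source of both the benefits (when supports are shared) and the perils (when they overlap only partially) named in the title.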



Publication year: 2018